System for the stereoscopic visualization of an object area.
Patent abstract:
The invention relates to a system (10(1)) for visualizing an object area (12), comprising an electronic image capture device (16) with an optical assembly (18). The optical assembly has a first optical channel (32) providing a first imaging beam path (30), which images the object area (12) onto a first sensor surface (68) or a plurality of first sensor surfaces (68) of the image capture device (16), and a second optical channel (32') providing a second imaging beam path (30'), which images the object area (12) onto a second sensor surface (68') or a plurality of second sensor surfaces (68') of the image capture device (16); it contains a microscope main objective system (20) traversed by the first imaging beam path (30) and the second imaging beam path (30'). A first image generation device (74) for visualizing the object area (12) for a first observer (76) can be supplied with a first image of the object area (12) captured on the first sensor surface (68) or the plurality of first sensor surfaces (68) and with a second image of the object area (12) captured on the second sensor surface (68') or the plurality of second sensor surfaces (68'). A second image generation device (74') visualizes the object area (12) for a second observer (76'). For visualizing the object area (12), the second image generation device (74') can be supplied with an image of the object area (12) that contains at least one image section of the first image or of the second image.

Publication number: CH712453A2
Application number: CH00564/17
Filing date: 2017-04-28
Publication date: 2017-11-15
Inventors: Regensburger Alois; Hauger Christoph
Applicant: Zeiss Carl Meditec AG
IPC main class:
Patent description:
Description: [0001] The invention relates to a system for visualizing an object area, having an electronic image capture device and an optical assembly with a first optical channel for a first imaging beam path that images the object area onto a first sensor surface or a plurality of first sensor surfaces of the image capture device, and a second optical channel for a second imaging beam path that images the object area onto a second sensor surface or a plurality of second sensor surfaces of the image capture device, and containing a microscope main objective system traversed by the first imaging beam path and the second imaging beam path. The system has a first image generation device for visualizing the object area for a first observer, to which a first image of the object area captured on the first sensor surface or the plurality of first sensor surfaces and a second image of the object area captured on the second sensor surface or the plurality of second sensor surfaces can be supplied, and a second image generation device for visualizing the object area for a second observer. Moreover, the invention relates to an object area visualization method in which a first image of the object area and a second image of the object area are captured in a first optical channel having a first imaging beam path and in a second optical channel having a second imaging beam path, whose optical axes form a stereo angle α in the object area, and in which the object area is displayed to a first observer and a second observer. The invention also relates to a computer program product for providing image data on a computer unit for visualizing an object area using such a method. Such a system for the stereoscopic visualization of an object area is known from US Pat. No. 6,525,878 B1.
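The stereo angle α mentioned above is fixed by the separation of the two parallel beam-path axes above the main objective and by the working distance at which the objective brings them to intersection in the object plane. A minimal sketch of this relationship, assuming a simple thin-lens geometry (the function name and the example values are illustrative, not taken from the patent):

```python
import math

def stereo_angle(base_mm, working_distance_mm):
    """Stereo angle α (in degrees) at which two beam-path axes, parallel
    above the main objective and separated by base_mm, intersect in the
    object plane at the given working distance."""
    return 2.0 * math.degrees(math.atan(base_mm / (2.0 * working_distance_mm)))
```

For an assumed stereo base of 24 mm and a working distance of 250 mm this yields roughly 5.5°; the actual values depend on the optical design.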
This system contains two separate electronic image capture devices, each of which is fed via a beam splitter by one of two stereoscopic partial beam paths that pass through a common microscope main objective. The object of the invention is to provide a system and a method for visualizing an object area, in particular an operating area, by means of which the stereoscopic visualization of the object area is made possible from a multiplicity of different perspectives. This object is achieved by the system specified in claims 1 and 8 and by the object area visualization method specified in claims 10 and 13. Advantageous embodiments of the invention are indicated in the dependent claims. In a system according to the invention for visualizing an object area, the second image generation device can be supplied, for visualizing the object area, with an image of the object area that contains at least one image section of the first image or of the second image. A system according to the invention includes a position determination device for determining the position of the perpendicular projection of the line connecting the eyes of the second observer into the plane perpendicular to an optical axis of the microscope main objective system. In the invention, the position of the perpendicular projection of the connecting line of the eyes of the second observer into the plane perpendicular to an optical axis of the microscope main objective system can be determined in particular by referencing the position of the base body of the microscope unit to the image generation device by means of triangulation and image recognition, continuously evaluating the signals of an image sensor.
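The perpendicular projection of the eye connecting line into the plane perpendicular to the optical axis, on which the position determination relies, is a plain vector projection. A sketch assuming the 3-D eye positions and the axis direction are already known from the triangulation step (all names are illustrative):

```python
import numpy as np

def project_eye_line(eye_left, eye_right, optical_axis):
    """Project the line connecting the observer's eyes into the plane
    perpendicular to the microscope's optical axis (axis 22 in the text)."""
    a = np.asarray(optical_axis, dtype=float)
    a /= np.linalg.norm(a)                             # unit vector along the axis
    d = np.asarray(eye_right, dtype=float) - np.asarray(eye_left, dtype=float)
    return d - np.dot(d, a) * a                        # remove the axial component
```

The returned vector lies in the plane perpendicular to the axis; its direction fixes the orientation of the image section supplied to the second observer.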
Within the scope of the invention, however, it is also possible for the position of the perpendicular projection of the connecting line of the eyes of the second observer into the plane perpendicular to an optical axis of the microscope main objective system to be determined from a position of the second observer relative to the microscope main objective, which can e.g. be entered manually at an input device. A system according to the invention may include a computer unit having a computer program that comprises an image extraction routine, in which the image section corresponding to the determined position of the perpendicular projection of the connecting line of the eyes of the second observer is determined from the captured first image of the object area or from the captured second image of the object area. In this way, it is possible to display the object area to an observer in the correct orientation and with a perspective corresponding to natural vision. It is advantageous if the computer program includes an evaluation routine for evaluating the position, determined by means of the position determination device, of the perpendicular projection of the line connecting the eyes of the second observer into the plane perpendicular to an optical axis of the microscope main objective system, which supplies the second image generation device with the captured first image and the captured second image for displaying a stereoscopic image of the object area when the position determined by the position determination device is parallel to the plane defined by an optical axis of the first imaging beam path and by an optical axis of the second imaging beam path on the side of the microscope main objective system facing away from the object area, or is at an angle φ to this plane for which:
-15° < φ < 15° or -10° < φ < 10° or -5° < φ < 5°. In particular, the computer program may include a routine for extrapolating the at least one image section of the first image or of the second image into an image that can be supplied to the second image generation device for visualizing the object area and that has an image format adapted to an image display of the image generation device. For this, e.g. an upscaling corresponding to a magnification factor and an interpolation of the image supplied to the image generation device to a new image resolution can be performed. In addition, in the system, the routine for extrapolating the at least one image section of the first image or of the second image may transfer the at least one image section into an image, suppliable to the second image generation device for visualizing the object area, that has image areas lying outside the first image or the second image. A sensor surface of an image sensor in a system according to the invention may have a rectangular shape. In particular, a sensor surface of an image sensor in such a system may be square. In the present case, the sensor surface of an image sensor is understood to be the surface of the image sensor that is sensitive for the detection of light. The sensor surface of an image sensor formed as a CCD sensor may e.g. be the area provided with the matrix of photosensitive photodiodes for the detection of light, or a portion of this area. It should be noted that the shape of the sensor surface of an image sensor can be adjusted in particular also by selecting the areas of an image sensor used for detecting light. For example, the portion of the area provided with the matrix of photosensitive photodiodes by which light from the object area is detected may be a square area. In the present case, the resolution of an image display is understood to mean a pixel resolution of the image display. The resolution of an image is likewise to be understood as a pixel resolution, i.e.
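The evaluation routine's decision can be reduced to computing the angle φ between the projected eye line and the stereo-base direction and testing it against one of the thresholds above. A hedged sketch (folding φ into the range (-90°, 90°] reflects that parallel and anti-parallel eye lines are equivalent for this test; all names are illustrative):

```python
import math

def angle_phi(eye_line_2d, stereo_base_2d):
    """Signed angle (degrees) between the projected eye line and the
    stereo-base direction, both given in the plane perpendicular to
    the optical axis."""
    ex, ey = eye_line_2d
    bx, by = stereo_base_2d
    return math.degrees(math.atan2(ex * by - ey * bx, ex * bx + ey * by))

def supply_stereo(phi_deg, threshold_deg=15.0):
    """True if the two captured images may be shown as a stereo pair."""
    phi = (phi_deg + 90.0) % 180.0 - 90.0   # parallel == anti-parallel
    return abs(phi) < threshold_deg
```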
as the number of pixels that make up the image. A preferred embodiment of the invention provides that the spectral transmission of the first optical channel and of the second optical channel is different for the light passing through the first optical channel and the light passing through the second optical channel. The system may include an image overlay stage for overlaying the first image and the second image into a monoscopic overlay image of the object area. In this case, the image overlay stage can be operatively connected to the first image generation device for supplying the monoscopic overlay image to it. In particular, a system according to the invention may comprise a position determination device for determining the position of the perpendicular projection of the connecting line of the eyes of the second observer into the plane perpendicular to an optical axis of the microscope main objective system, and may contain a computer unit with a computer program that includes an image section determination routine, in which the image section corresponding to the determined position of the perpendicular projection of the line connecting the eyes of the second observer is determined from the monoscopic overlay image of the object area. It is advantageous if the computer unit is operatively connected to the second image generation device for supplying the monoscopic overlay image to it. With a coupling device for coupling the movement of the at least one displaceable lens in the first optical system with the movement of the at least one displaceable lens in the second optical system and the movement of the at least one displaceable lens in the third optical system, it can be ensured in a system according to the invention that, when the imaging scale in one optical channel is varied, the imaging scale in the other two optical channels remains the same.
Since digital display devices are typically rectangular and in particular often have the aspect ratio 16:9, according to the invention a rectangular display region is selected from regions of interest on the sensor surface of the image sensor, i.e. from areas of the sensor surface containing information of interest, and this region is then actually transmitted as an image signal to an observer. The inventors have found that it is advantageous if the display region on the image sensor has the same pixel resolution, e.g. 3840 x 2160 pixels, and the same aspect ratio as the electronic display for the observer's eye. According to the invention, it is therefore proposed that, when the pixel resolutions of the display and of the image sensor deviate from one another, a read-out region is subsequently adapted to the display by digital zooming with pixel interpolation and optionally by further trimming of the read-out region. It should be noted that the display region can in particular enclose an image area as a circle, in order to display the complete image with dark edges in the display. However, it is also possible to design an image area as an inner circle in order to avoid dark areas at the edges of the display. In particular, it is advantageous if the electronic image capture device has an image processing stage, connected to the data separation stage, for digitally rotating a group of image data in a manner matched to the position of the perpendicular projection of the line connecting the eyes of a main observer and/or a secondary observer into the plane perpendicular to the optical axis of the main objective system. Here it can be provided in particular that the image processing stage interpolates the brightness and/or the color of the image data of the first and/or second and/or third group of image data on the basis of predetermined brightness and/or color information for the image data.
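The read-out adaptation described here, i.e. selecting a rectangular region of interest and digitally zooming it with pixel interpolation to the display resolution, can be sketched as follows. Nearest-neighbour interpolation is used for brevity; a real implementation would typically interpolate bilinearly, and all names and default values are illustrative:

```python
import numpy as np

def adapt_readout(sensor_img, x0, y0, w, h, disp_w=3840, disp_h=2160):
    """Crop a read-out region of interest (x0, y0, w, h) from the sensor
    image and digitally zoom it (nearest-neighbour pixel interpolation)
    to the display resolution disp_w x disp_h."""
    roi = sensor_img[y0:y0 + h, x0:x0 + w]
    ys = np.arange(disp_h) * h // disp_h    # source row for each display row
    xs = np.arange(disp_w) * w // disp_w    # source column for each display column
    return roi[np.ix_(ys, xs)]
```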
The invention also extends to an object area visualization method in which a first image of the object area and a second image of the object area are captured in a first optical channel with a first imaging beam path and in a second optical channel with a second imaging beam path, whose optical axes form a stereo angle α in the object area, and in which the object area is displayed to a first observer and a second observer, wherein the first observer is supplied with the first image and the second image for the stereoscopic visualization of the object area, and wherein, for the visualization of the object area, the second observer is supplied with an image of the object area that contains at least one image section of the first image or of the second image. In addition, the invention also extends to an object area visualization method in which a first image of the object area and a second image of the object area are captured in a first optical channel with a first imaging beam path and in a second optical channel with a second imaging beam path, whose optical axes form a stereo angle α in the object area, and in which the object area is displayed to a first observer and a second observer, wherein the spectral transmission of the first optical channel and of the second optical channel is different for the light passing through the first optical channel and the light passing through the second optical channel, wherein the first observer is supplied with a superimposed image in the form of an overlay of the first image and the second image for the visualization of the object area, and wherein the second observer is shown, for the visualization of the object area, an image section of the superimposed image having an image edge parallel to the perpendicular projection of the line connecting the eyes of the second observer into the object area.
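Extracting an image section whose edge is parallel to the projected eye line amounts to sampling a window rotated by the co-observer angle about the image centre. A minimal nearest-neighbour sketch (illustrative names; a production system would use proper resampling and handle out-of-bounds pixels explicitly rather than clipping):

```python
import numpy as np

def rotated_window(image, beta_deg, out_h, out_w):
    """Sample an out_h x out_w window from `image` whose base side is
    rotated by beta_deg about the image centre (nearest-neighbour)."""
    b = np.deg2rad(beta_deg)
    cy, cx = (image.shape[0] - 1) / 2.0, (image.shape[1] - 1) / 2.0
    ys, xs = np.mgrid[0:out_h, 0:out_w]
    u = xs - (out_w - 1) / 2.0              # window coordinates about centre
    v = ys - (out_h - 1) / 2.0
    src_x = cx + u * np.cos(b) - v * np.sin(b)
    src_y = cy + u * np.sin(b) + v * np.cos(b)
    src_x = np.clip(np.rint(src_x).astype(int), 0, image.shape[1] - 1)
    src_y = np.clip(np.rint(src_y).astype(int), 0, image.shape[0] - 1)
    return image[src_y, src_x]
```

At β = 0° this reduces to a central crop; at β = 180° the window is rotated by a half turn.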
Furthermore, the invention extends to a computer program product for carrying out an object area visualization method specified above by means of a computer to which the first image of the object area and the second image of the object area are supplied. In the following, the invention is explained in more detail with reference to the embodiments schematically illustrated in the drawing, in which:

Fig. 1 shows a first system for the visualization of an object area with a first image generation device for a first observer and with a second image generation device for a second observer;
Fig. 2 shows a prism color splitter with image sensors in the system;
Fig. 3 shows a projection of the sensor surfaces of image sensors in the system along the line III-III of Fig. 1;
Figs. 4A and 4B illustrate the acquisition and processing of image data in the first system for visualizing the object area with the first and second image generation devices;
Fig. 5A is a partial section of a second system for visualizing an object area with sensor surfaces of image sensors and a first and a second observer;
Figs. 5B and 5C are further partial sections of the second system;
Figs. 6A and 6B illustrate the acquisition and processing of image data in a third, alternative system for visualizing the object area with the first and second image generation devices;
Fig. 7 shows a partial section of a fourth system for the visualization of an object area with sensor surfaces of image sensors and a first and a second observer;
Fig. 8 is a partial section of a fifth system for visualizing an object area with sensor surfaces of image sensors and first and second observers; and
Fig. 9 shows a partial section of a sixth system for the visualization of an object area with a first image generation device for a first observer and with a second image generation device for a second observer.

The first system 10 for visualizing an object area 12 shown in Fig.
1 has a microscope unit 14 that contains an electronic image capture device 16 and has an adjustable optical assembly 18. The optical assembly 18 includes a microscope main objective system 20 having an optical axis 22, a first optical system 24 and a second optical system 24'. In each of the optical systems 24, 24' there is a first and a second lens element 26, 28, designed as a cemented member, which can be displaced in a direction parallel to the optical axis 22 of the microscope main objective system 20. The first optical system 24 is traversed in the microscope unit 14 by the first monoscopic imaging beam path 30. The second monoscopic imaging beam path 30' extends through the second optical system 24' in the microscope unit 14. The first optical system 24 provides a first optical channel 32 for the first monoscopic imaging beam path 30, which images the object area 12 onto first sensor surfaces of the image capture device 16. Accordingly, the second optical system 24' defines a second optical channel 32' for the second monoscopic imaging beam path 30', which images the object area 12 onto second sensor surfaces of the image capture device 16. The first and second monoscopic imaging beam paths 30, 30' together form a stereoscopic imaging beam path, which enables stereoscopic visualization of the object area 12. The imaging beam paths 30, 30' have optical axes 34, 34' that are parallel to one another on the side of the microscope main objective system 20 facing away from the object area 12 and that form the stereo angle α in the object plane 36. The two optical systems 24, 24' have an identical structure. Each optical system 24, 24' contains a prism color splitter 40, 40' as an optical element traversed by the corresponding monoscopic imaging beam path 30, 30'. The optical elements of the optical systems 24, 24' are each received in a holder 38, 38'. Fig.
2 shows the prism color splitter 40 with the image sensors 42, 44 and 46 of the image capture device 16. The prism color splitter 40 includes an isosceles right-angled prism 48 and a rectangular prism 50 whose base surface 52 adjoins the side surface 54 of the isosceles right-angled prism 48. The prism color splitter 40 further has a prism cube 56 with a side surface 58 abutting the side surface 60 of the isosceles right-angled prism 48. On the side surface 54 and the side surface 60 of the isosceles right-angled prism 48, a dielectric interference filter 62, 64 is arranged in each case. The dielectric interference filter 64 reflects the blue light component of the first monoscopic imaging beam path 30, which strikes the base surface 66 of the prism cube 56 along the optical axis 34, into a beam path whose direction is perpendicular to the optical axis 34 (Fig. 1). The green component and the red component of the light in the imaging beam path 30 pass along the direction of the optical axis 34 through the interference filter 64 into the isosceles right-angled prism. The interference filter 62 splits this light into a beam path with the red light component extending in the direction of the optical axis 34 through the rectangular prism 50, and into a beam path whose direction is perpendicular to the direction of the optical axis 34 (Fig. 2). The image sensors 42, 44, 46 have first sensor surfaces 68, 70, 72, on which the adjustable optical assembly 18 in each case generates a first image of the object area 12. The prism color splitter 40' of the second optical system 24' has a structure identical to the prism color splitter 40 of the first optical system 24. The prism color splitter 40' guides the monoscopic imaging beam path 30' to image sensors having second sensor surfaces that correspond to the first sensor surfaces 68, 70, 72 of the image sensors 42, 44, 46.
There, the monoscopic imaging beam path 30' in each case generates a second image of the object area 12. Fig. 1 shows the image sensor 42 with a first sensor surface 68 and the image sensor 42' with a second sensor surface 68'. [0034] The system 10 (Fig. 1) for visualizing an object area 12 includes a first image generation device 74 for displaying an image of the object area 12 for a first observer 76. There is also a second image generation device 74' in the system 10, by means of which a further image of the object area 12 can be displayed for a second observer 76'. The first image generation device 74 and the second image generation device 74' are each designed as a so-called head-mounted display (HMD), which makes it possible to display both monoscopic images and stereoscopic images to the eyes 78, 80 and 78', 80' of the first observer 76 and the second observer 76' by means of displays. The microscope unit 14 with the image capture device 16 and the optical assembly 18 is received on a stand 82. This stand 82 includes mechanisms for the compensation of load torques with motor drives and/or balance weights (not shown), which allow the first observer 76 to set the observation direction of the microscope unit 14 largely force-free. The microscope unit 14 has handles 84 for this purpose. By moving the microscope unit 14 by means of the handles 84, the first observer 76 can e.g. set the distance A of the microscope main objective system 20 from the object area 12 and the position of the optical axis 22 of the microscope main objective system 20 relative to the object area 12. The system 10 (Fig. 1) includes a computer 86 connected to the image capture device 16. In the computer 86 there is a computing unit 98, which takes the first image 88 of the object area 12 acquired at the first sensor surfaces 68, 70, 72 and the image acquired at the second sensor surfaces, such as e.g.
the image 90 of the object area 12 acquired at the second sensor surface 68', and combines them to form a stereoscopic image that can be displayed to the first observer 76 by means of the first image generation device 74. Fig. 3 shows a partial section of the system along the line III-III of Fig. 1 with a projection of the sensor surfaces 68, 70, 72 of the image sensors 42, 44, 46 and the sensor surfaces 68', 70', 72' of the image sensors 42', 44', 46' into the plane of the microscope main objective system 20. The sensor surface 68 of the image sensor 42 and the sensor surface 68' of the image sensor 42' each have a rectangular shape with an aspect ratio of 16:9. It should be noted that the sensor surfaces 68, 68' of the image sensors 42, 42' can in principle have an arbitrary aspect ratio, for example also the aspect ratio 4:3 or even 1:1. The computing unit 98 reads out the first and second image sensors of the image capture device 16 and supplies the read image data to the image generation device 74. In a stereo image mode, the first observer 76 is thus presented with the image 88 of the object area 12 acquired by the first image sensors in the image capture device 16 and the image 90 of the object area 12 captured by the second image sensors in the image capture device 16 as a stereoscopic image, which is formed from the image 88 as a first stereoscopic partial image and from the image 90 as a second stereoscopic partial image. By adjusting the microscope unit 14 at the handles 84, the first observer 76 in the system 10 (Fig. 1) can ensure that the perpendicular projection of the imaginary connecting line 92 of their eyes 78, 80 into the plane perpendicular to the optical axis 22 of the microscope main objective system 20 is parallel to the perpendicular bisector of the distance line 94 between the optical axes 34, 34' of the first monoscopic imaging beam path 30 and the second monoscopic imaging beam path 30'.
The first observer 76 is then shown a stereoscopic image of the object area 12 that corresponds to a natural visual impression. The second image generation device 74' includes a position determination device 96, formed as a continuously measuring system, for determining the position of the perpendicular projection of the connecting line 92' of the eyes 78', 80' of the second observer 76' into the plane perpendicular to an optical axis 22 of the microscope main objective system 20, by referencing the position of the base body of the microscope unit 14 to the image generation device 74' by means of triangulation and image recognition through evaluation of the signals of an image sensor (not shown) integrated in the position determination device 96. The computer 86 shown in Fig. 1 in the system 10 has a further computing unit 98'. The further computing unit 98' is used for reading out the first image sensors 42, 44 and 46 or the second image sensors 42', 44' and 46' in the image capture device 16. The position determination device 96 transmits the detected position to a position calculation stage 100 in the computer 86. The position calculation stage 100 provides position information to the computing unit 98', by means of which the computing unit 98' generates, from the image signals of the image sensors 42, 44, 46 or the image sensors 42', 44', 46', image data that form a monoscopic rectangular image 102 of the object area 12 having a base side 104 that is parallel to the position, determined by the position determination device 96, of the perpendicular projection of the connecting line 92' of the eyes 78', 80' of the second observer 76' into the plane perpendicular to the optical axis 22 of the microscope main objective system 20. Figs. 4A and 4B show the acquisition and processing of image data in the system 10 (Fig. 1) for visualizing the object area 12 with the first image generation device 74 and the second image generation device 74'.
The image data of the image 88 captured by the image capture device 16 or of the image 90 captured by the image capture device 16 are displayed to the left and right eyes 78', 80' of the second observer 76' in the system 10 (Fig. 1) as a monoscopic image 102 of the object area 12, which is digitally windowed and rotated with respect to the partial image 88, 90. In the system 10 (Fig. 1), the second observer 76' has the possibility to determine through which of the two optical channels 32, 32' the object area 12 is visualized by means of the image generation device 74'. In this way, the system 10 (Fig. 1) allows a visualization of the object area 12 optimized for the second observer 76'. It is possible to display to the second observer 76' an image 102 of the object area 12 that is captured with that imaging beam path 30, 30' whose optical axis 34, 34' projects into an operation channel. In addition, it is possible to display to the second observer 76' an image 102 of the object area 12 that is captured with that imaging beam path 30, 30' which is not shaded by surgical instruments. A modified embodiment of the system 10 (Fig. 1) provides that a switching of the optical channel 32, 32' through which the image of the object area 12 is displayed to the second observer 76' takes place automatically as a function of the co-observer viewing angle or as a function of a determined image brightness on the first and second sensor surfaces of the image sensors. In order to avoid irritation of the second observer 76', it is advantageous if a period of time of at least 5 minutes, or preferably at least 10 minutes, lies between automatic switchings of the image displayed to the second observer 76' from the one optical channel 32 to the other optical channel 32', or if this switching is performed only in the case of a change of position of the second observer 76'.
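The automatic channel switching with a minimum dwell time described in this embodiment can be sketched as a small state machine. Here the switching criterion is the mean image brightness per channel; the class name, interface and the 5-minute default are illustrative assumptions, not taken from the patent:

```python
class ChannelSwitcher:
    """Select the optical channel shown to the co-observer, enforcing a
    minimum dwell time between automatic switches to avoid irritation."""

    def __init__(self, min_dwell_s=300.0):       # at least 5 minutes by default
        self.min_dwell_s = min_dwell_s
        self.channel = 0
        self.last_switch = None

    def update(self, brightness, now_s):
        """brightness: (mean brightness channel 0, mean brightness channel 1);
        now_s: current time in seconds. Returns the channel to display."""
        preferred = 0 if brightness[0] >= brightness[1] else 1
        if preferred != self.channel:
            if self.last_switch is None or now_s - self.last_switch >= self.min_dwell_s:
                self.channel = preferred
                self.last_switch = now_s        # start a new dwell period
        return self.channel
```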
The first observer 76 is supplied with the stereoscopic partial images 88, 90 of the object area 12 captured by the image capture device 16 via the computing unit 98 of the computer 86 shown in Fig. 1. In this case, the computing unit 98 can display to the first observer 76 all the image information supplied to the first and second sensor surfaces of the image sensors in the first and second imaging beam paths 30, 30'. It should be noted that in a modified embodiment of the system 10 (Fig. 1) it may be provided that the image of the object area 12 displayed to the first observer 76 can be digitally zoomed by the computing unit 98 or, alternatively, that a digitally windowed and/or an interpolated image of the object area is displayed to the first observer 76. This is e.g. advantageous if the resolution of the image display or the format of the image display of the image generation device 74 deviates from the resolution or the format for capturing image data of the image capture device 16. In order to display to the second observer 76' a correctly oriented image of the object area 12, an image 102 is extracted from the image 88 captured in the imaging beam path 30 by means of the image capture device 16, as a digitally windowed and rotated partial image. It is accepted that only a reduced field of view can be used for the display of this correctly oriented image of the object area and that the image 102, as a digitally windowed and rotated partial image of the image 88, has a reduced resolution compared to the image 88. The image 102 is then adjusted in the computing unit 98' by digital zooming/interpolation to the resolution of a display in the image generation device 74'. The image 102 is preferably arranged centrally in the image 88, which is a stereoscopic partial image.
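How strongly the field of view is reduced by windowing and rotating depends on the co-observer viewing angle β: the extracted window, rotated by β, must still fit on the sensor surface. A sketch computing the largest such window with the sensor's own aspect ratio, assuming a centred window (names are illustrative):

```python
import math

def usable_window(sensor_w, sensor_h, beta_deg):
    """Largest centred window with the sensor's aspect ratio that, rotated
    by beta_deg, still fits on the sensor_w x sensor_h sensor surface."""
    c = abs(math.cos(math.radians(beta_deg)))
    s = abs(math.sin(math.radians(beta_deg)))
    # the rotated window's bounding box must fit inside the sensor
    scale = min(sensor_w / (sensor_w * c + sensor_h * s),
                sensor_h / (sensor_w * s + sensor_h * c))
    return sensor_w * scale, sensor_h * scale
```

At β = 0° the full sensor is usable; at β = 90° a 3840 x 2160 sensor only yields a 2160 x 1215 window, i.e. roughly a third of the pixels.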
The orientation of the image 102 corresponds to the position of the perpendicular projection of the connecting line 92' of the eyes 78', 80' of the second observer 76' into the plane of the microscope main objective system 20. It should be noted that in a modified embodiment of the system 10 (Fig. 1) it may be provided that the position determination device 96 is not configured as a continuously measuring system but rather has an input device (not shown) for manually inputting the position of the second observer 76', at which this position with respect to the microscope unit 14 or the position of the first observer 76 can be entered. The position determination device 96 here determines the position of the perpendicular projection of the connecting line 92' of the eyes 78', 80' of the second observer 76' into the plane perpendicular to an optical axis 22 of the microscope main objective system 20 on the basis of the position of the second observer 76' entered manually at the input device. In this case, an orientation of the image 102, and thus of the image of the object area 12 displayed to the second observer 76' by the computing unit 98', is only changed in relation to the microscope unit 14 or with respect to the microscope main objective system 20 when a new, changed position is entered at the input device of the position determination device 96. It should be noted that, depending on the viewing angle β of the second observer 76' relative to the perpendicular bisector of the distance line 94 between the optical axes 34, 34' of the first monoscopic imaging beam path 30 and the second monoscopic imaging beam path 30', the maximum sensor surface usable for the image 102 changes. At the viewing angle β = 0°, the second observer 76' can in principle be supplied with all image data captured in an optical channel 32, 32'. At the viewing angle β = 90° or β = 270°, the second observer 76' receives only a section of this image data. [0054] It should also be noted that in the system 10 (Fig.
1), instead of a head-mounted display (HMD), a screen or another display device for displaying two- or three-dimensional image information can be provided as an image generation device for the first observer 76 and the second observer 76'. FIGS. 5A, 5B and 5C each show partial sections of a second system 10 (2) for visualizing an object region, with a first image generation device 74 for a first observer 76 and with a second image generation device 74' for a second observer 76'. Insofar as assemblies and elements in FIGS. 5A, 5B and 5C correspond to assemblies and elements from the figures described above, they are identified by the same reference numerals. The function and design of the second system 10 (2) for visualizing an object region basically correspond to the function and design of the first system 10 (1) for visualizing an object region 12. Therefore, only the differences of the second system 10 (2) for visualizing an object region 12 from the first system 10 (1) for visualizing an object region 12 are explained below. In the second system 10 (2) for visualizing an object region, not only the first image generation device 74 but also the second image generation device 74' permits stereoscopic viewing of the object region when the first observer 76 and the second observer 76' are positioned side by side or opposite one another. The first observer 76 is supplied with the stereoscopic partial images 88, 90 of the object region 12 captured by means of the image capture device 16 via the computing unit 98. In principle, the full image information can be utilized here by the first observer 76. Depending on the viewing direction detected by the position determination device 96, the second observer 76' receives either a stereoscopic or a monoscopic display image from the image capture device 16.
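The choice between a stereoscopic and a monoscopic display image as a function of the detected viewing angle β can be sketched as a small decision routine (a hypothetical sketch; the function name and the default tolerance band are illustrative assumptions, the tolerance ranges named in the text go up to ±30°):

```python
def select_display_mode(beta_deg: float, tol_deg: float = 5.0) -> str:
    """Pick the second observer's display mode from the viewing angle beta.

    Near beta = 0 deg both observers share the same stereoscopic partial
    images; near beta = 180 deg the partial images are swapped and mirrored
    horizontally; at other angles a monoscopic, digitally windowed and
    rotated image is shown.
    """
    beta = beta_deg % 360.0
    if min(beta, 360.0 - beta) < tol_deg:
        return "stereo"                    # same partial images as observer 1
    if abs(beta - 180.0) < tol_deg:
        return "stereo_swapped_mirrored"   # images exchanged and mirrored
    return "mono"                          # windowed/rotated partial image
```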
The position determination device 96 determines the position of the perpendicular projection of the connecting line 92' of the eyes 78', 80' of the second observer 76' into the plane perpendicular to an optical axis 22 of the microscope main objective system 20. If the viewing angle β is β = 0°, as shown in Fig. 5A, i.e. if the viewing angle for the first observer 76 and the viewing angle for the second observer 76' are the same, then the image generation device 74 and the image generation device 74' of the first and second observers 76, 76' each visualize the same stereoscopic partial images 88, 90 captured by the image capture device. If, on the other hand, the viewing angle is β = 180°, the image generation device 74' of the second observer 76' again visualizes the same images 88, 90 captured by the image capture device, which are stereoscopic partial images. However, as shown in Fig. 5B, the images 88, 90 displayed by the image generation device 74' are interchanged relative to the images 88, 90 displayed by the image generation device 74 of the observer 76 and are displayed digitally mirrored in the horizontal direction. For viewing angles β other than β = 0° and β = 180°, i.e. for example for the setting of the second system 10 (2) for visualizing the object region 12 shown in FIG. 5C, the image generation device 74' of the second observer 76' visualizes a monoscopic image 102 as an observation image of the object region 12. In a modified embodiment of the second system 10 (2) it can be provided that the image generation device 74' of the second observer 76' stereoscopically visualizes the images 88, 90 captured by the image capture device 16 as stereoscopic partial images, as described above, if the deviation Δβ of the viewing angle β of the second observer 76' from the viewing angle β = 0° or β = 180° satisfies: −5° < Δβ < 5° or −10° < Δβ < 10° or −15° < Δβ < 15° or −20° < Δβ < 20° or −25° < Δβ < 25° or −30° < Δβ < 30°. FIGS.
6A and 6B show the acquisition and processing of image data in a further, third system for visualizing the object region 12 with the first and the second image generation device 74, 74'. Insofar as assemblies and elements in FIGS. 6A and 6B correspond to assemblies and elements from the figures described above, they are identified by the same reference numerals. The function and design of the third system 10 (3) for visualizing an object region basically correspond to the function and design of the first system 10 (1) for visualizing an object region 12. Therefore, only the differences of the third system 10 (3) for visualizing an object region 12 from the first system 10 (1) for visualizing an object region 12 are explained below. In order to at least partially compensate for the disadvantage of a limited field of view and a reduced resolution for the second observer 76', in the third system 10 (3) for visualizing the object region 12 the subregion 106 read out by digital windowing and rotation from the image 88, captured by the image capture device 16 as a stereoscopic partial image, is comparatively larger. However, the aspect ratio of the subregion 106 is chosen to match the aspect ratio of the captured image 88. It is therefore accepted that screen areas 108 containing no image information occur at the edge of the image display in the image generation device 74' for the second observer. However, the digital resampling of the image supplied to the image generation device 74', which is required so that the image displayed by the image generation device 74' for visualizing the object region fills its display, can be smaller here than in the first and second systems 10 (1), 10 (2) for visualizing an object region described above. FIG. 7 shows a partial section of a fourth system 10 (4) for visualizing an object region with sensor surfaces of image sensors as well as a first and a second observer 76, 76'. Insofar as assemblies and elements in FIG.
7 correspond to assemblies and elements from the figures described above, they are identified by the same reference numerals. The function and design of the fourth system 10 (4) for visualizing an object region 12 basically correspond to the function and design of the first system 10 for visualizing an object region 12. Therefore, only the differences of the fourth system 10 (4) for visualizing an object region 12 from the first system 10 for visualizing an object region 12 are explained below. In the fourth system 10 (4), the sensor surfaces 68, 68', 70, 70', 72, 72' of the image sensors 42, 42', 44, 44', 46, 46' are square. The digitally windowed image 102 of the object region 12, which is rotated with respect to the partial image 88, is positioned centrally on the sensor surface 68, 68', 70, 70', 72, 72' of an image sensor 42, 42', 44, 44', 46, 46'. It has the maximum possible extent. In a modification of the fourth system 10 (4), it may be provided that the aspect ratio of the sensor surfaces 68, 68', 70, 70', 72, 72' of the image sensors 42, 42', 44, 44', 46, 46' deviates from the aspect ratio of the images displayed by the image generation device 74. It should be noted that in the fourth system 10 (4) for visualizing an object region, image sensors for capturing images with non-square image formats and image sensors with a square sensor surface for capturing square images can in principle both be used. It should also be noted that, in principle, both the stereoscopic partial images 88, 90 for the visual impression of the first observer 76 and the image for the second observer 76' in the required, generally rotated, mono perspective can be obtained by digital windowing from the image captured by means of an image sensor. This offers the advantage that nearly the same resolution and nearly the same width of the field of view can be displayed both to the first observer 76 and to the second observer 76'. FIG. 8 shows a partial section of a fifth system 10 (FIG.
5) for visualizing an object region 12 with sensor surfaces of image sensors as well as a first and a second observer 76, 76'. Insofar as assemblies and elements in FIG. 8 correspond to assemblies and elements from the figures described above, they are identified by the same reference numerals. The function and design of the fifth system 10 (5) for visualizing an object region 12 basically correspond to the function and design of the first system 10 for visualizing an object region 12. In the following, only the differences of the fifth system 10 (5) for visualizing an object region 12 from the first system 10 for visualizing an object region 12 are explained. In the fifth system 10 (5) for visualizing an object region 12, the geometry of the sensor surfaces of the image sensors of the image capture device differs between the two stereoscopic partial beam paths. In particular, the larger of the two sensor surfaces may be configured such that the resolution and the width of the field of view of the image formed in the second image generation device 74' are comparable to the resolution and the width of the field of view of the image displayed in the first image generation device 74. FIG. 9 shows a sixth system 10 (6) for visualizing an object region 12. Insofar as assemblies and elements in FIGS. 9 and 10 correspond to assemblies and elements from the figures described above, they are identified by the same reference numerals. The function and design of the sixth system 10 (6) for visualizing an object region 12 basically correspond to the function and design of the first system 10 (1) for visualizing an object region 12. In the following, only the differences of the sixth system 10 (6) for visualizing an object region 12 from the first system 10 for visualizing an object region 12 are explained. The sixth system 10 (6) is designed to visualize an object region 12 by means of fluorescent light.
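The sensor-surface sizing described above for the fourth and fifth systems can be illustrated with a small geometric sketch (my own derivation, not taken from the patent): a centred window of width w and height h can be read out at every rotation angle exactly when it stays inside the sensor's incircle, i.e. when its diagonal does not exceed the sensor side.

```python
import math

def rotation_safe_window(sensor_side: float, aspect: float):
    """Largest w x h window of the given aspect ratio (w/h) that fits on a
    square sensor at EVERY rotation angle: the window must lie inside the
    sensor's incircle, so its diagonal sqrt(w^2 + h^2) <= sensor_side."""
    h = sensor_side / math.sqrt(1.0 + aspect * aspect)
    return aspect * h, h

def required_sensor_side(window_w: float, window_h: float) -> float:
    """Conversely: the minimum square sensor side so that a w x h window can
    be read out at any rotation angle -- a way to size the larger of the two
    sensor surfaces so both observers see a comparable field of view."""
    return math.hypot(window_w, window_h)
```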
Arranged in the first optical channel 32 is a filter 110 which is transmissive only for light of a spectral range in which a dye, excited to fluorescence by means of a light source 112, fluoresces in the object region 12. Fig. 10 shows image sensors 42, 42' for detecting light in the first and second optical channels 32, 32' of the sixth system 10 (6). In the sixth system 10 (6), fluorescent light from the object region 12 is supplied to the image sensor 42 and white light from the object region 12 is supplied to the image sensor 42'. The computer unit of the system 10 (6) contains an image overlay stage 114. The image overlay stage 114 digitally superimposes the image areas 116, 116' of the sensor surfaces of the image sensors 42, 42' to form a monoscopic overlay image 88' of the object region 12. The first image generation device 74 for the first observer 76 receives this monoscopic overlay image directly. The second image generation device 74' is supplied with a monoscopic overlay image 88' processed in the image rotation stage 118. The image rotation stage 118 serves for the digitally rotated and windowed display of the image information of the overlay image 88' from the image overlay stage 114. For this purpose, the signal of the position determination device 96 is processed in the image rotation stage 118 in order to provide the second observer 76' with an image 102 of the object region 12 in an orientation corresponding to the position of the perpendicular projection of the connecting line 92' of the eyes 78', 80' of the second observer 76' into the plane perpendicular to an optical axis 22 of the microscope main objective system 20. The second observer 76' is shown, as an image of the object region 12, an image 102 with an image section 102' of the overlay image 88'. The displayed image 102 is digitally windowed and rotated relative to the overlay image 88'.
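The digital superposition performed by an image overlay stage can be sketched as follows (a minimal NumPy sketch; the pseudocolour and the additive blend are illustrative assumptions, since the text does not specify the blending rule):

```python
import numpy as np

def overlay_fluorescence(white_rgb: np.ndarray, fluo: np.ndarray,
                         color=(0.0, 1.0, 0.0), gain=1.0) -> np.ndarray:
    """Superimpose a monochrome fluorescence image onto a white-light RGB
    image (both with values in [0, 1]) to form one overlay image: the
    fluorescence signal is pseudocoloured, added, and clamped."""
    fluo_rgb = fluo[..., None] * np.asarray(color, dtype=float) * gain
    return np.clip(white_rgb + fluo_rgb, 0.0, 1.0)
```

The overlay image would then be fed directly to the first observer's display and, after windowing and rotation, to the second observer's display.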
It has an image edge parallel to the perpendicular projection of the connecting line 92' of the eyes 78', 80' of the second observer 76' into the object region 12. The displayed image 102 contains image information from two optical channels with different spectral transmission. This measure makes it possible to visualize in the image both structures in the object region 12 that are visible, e.g., only under fluorescent light, and structures that are accessible to observation under illumination of the object region 12 with white light. The invention also extends to a system for visualizing an object region in which features of the embodiments described above are combined. In summary, it should be noted that the invention relates to a system 10 (1), 10 (2), 10 (3), 10 (4), 10 (5), 10 (6) for visualizing an object region 12, with an electronic image capture device 16, with an optical assembly 18 which has a first optical channel 32 for a first imaging beam path 30 imaging the object region 12 onto a first sensor surface 68, 70, 72 or a plurality of first sensor surfaces 68, 70, 72 of the image capture device 16 and a second optical channel 32' for a second imaging beam path 30' imaging the object region 12 onto a second sensor surface 68' or a plurality of second sensor surfaces 68', 70', 72' of the image capture device 16, and which contains a microscope main objective system 20 through which the first imaging beam path 30 and the second imaging beam path 30' pass, with a first image generation device 74 for visualizing the object region 12 for a first observer 76, to which a first image 88 of the object region 12 captured on the first sensor surface 68, 70, 72 or the plurality of first sensor surfaces 68, 70, 72 and a second image 90 of the object region 12 captured on the second sensor surface 68' or the plurality of second sensor surfaces 68', 70',
72' can be supplied, and with a second image generation device 74' for visualizing the object region 12 for a second observer 76'. For visualizing the object region 12, the second image generation device 74' can be supplied with an image 102 of the object region 12 which contains at least one image section of the first image 88 or of the second image 90.

LIST OF REFERENCE NUMERALS
10 (1) ... 10 (6) system
12 object region
14 microscope unit
16 electronic image capture device
18 optical assembly
20 microscope main objective system
22 optical axis
24 first optical system
24' second optical system
26 first lens element
28 second lens element
30 first monoscopic imaging beam path
30' second monoscopic imaging beam path
32 first optical channel
32' second optical channel
34, 34' optical axis
34 (1), 34 (2) optical axis
36 object plane
38, 38' holder
40, 40' prism color splitter
42, 42', 44, 44', 46, 46' image sensor
48 first isosceles right-angled prism
50 right-angled prism
52 base
54 side surface
56 prism cube
58 side surface
60 side surface
62, 64 dielectric interference filter
66 base of the prism cube 56
68, 68', 70, 70', 72, 72' sensor surface
74, 74' image generation device
76, 76' observer
78, 80, 78', 80' eyes
82 stand
84 handle
86 computer
88, 90 image
88' overlay image
92, 92' connecting line
94 distance line
96 position determination device
98, 98' computing unit
100 position calculation stage
102 image
102' image section
104 base side
106 subregion
108 areas in the screen of the second observer
110 filter
112 light source
114 image overlay stage
116, 116' image area
118 image rotation stage

Claims (14)
[1] System (10 (1), 10 (2), 10 (3), 10 (4), 10 (5), 10 (6)) for visualizing an object region (12), having an electronic image capture device (16), with an optical assembly (18) which has a first optical channel (32) for a first imaging beam path (30) imaging the object region (12) onto a first sensor surface (68, 70, 72) or a plurality of first sensor surfaces (68, 70, 72) of the image capture device (16) and a second optical channel (32') for a second imaging beam path (30') imaging the object region (12) onto a second sensor surface (68') or a plurality of second sensor surfaces (68', 70', 72') of the image capture device (16), and which contains a microscope main objective system (20) through which the first imaging beam path (30) and the second imaging beam path (30') pass, with a first image generation device (74) for stereoscopically visualizing the object region (12) for a first observer (76), to which a first image (88) of the object region (12) captured on the first sensor surface (68, 70, 72) or the plurality of first sensor surfaces (68, 70, 72) and a second image (90) of the object region (12) captured on the second sensor surface (68') or the plurality of second sensor surfaces (68', 70', 72') can be supplied, and with a second image generation device (74') for visualizing the object region (12) for a second observer (76'), characterized by position determining
means (96) for determining the position of the perpendicular projection of the connecting line (92) of the eyes (78', 80') of the second observer (76') into the plane perpendicular to an optical axis (22) of the microscope main objective system (20), and a computing unit (98) with a computer program containing an image section determination routine in which the image section corresponding to the determined position of the perpendicular projection of the connecting line (92) of the eyes (78', 80') of the second observer (76') is determined from the captured first image (88) of the object region (12) or from the captured second image (90) of the object region (12), wherein the second image generation device (74') can be supplied, for the monoscopic visualization of the object region (12), with an image (102) of the object region (12) containing the at least one image section of the first image (88) or of the second image (90). [2] A system (10 (2), 10 (3)) according to claim 1, characterized in that the second image generation device (74') can be supplied, for stereoscopically visualizing the object region (12), with the captured first image (88) and the captured second image (90) when the position, determined by the position determining device (96), of the perpendicular projection of the connecting line (92') of the eyes (78', 80') of the second observer (76') into the plane perpendicular to an optical axis (22) of the microscope main objective system (20) is parallel to the plane spanned by an optical axis (34) of the first imaging beam path (30) and by an optical axis (34') of the second imaging beam path (30') on the side of the microscope main objective system (20) facing away from the object region (12), or is at an angle φ to this plane which satisfies the following condition: −15° < φ < 15° or −10° < φ < 10° or −5° < φ < 5°, and in that the second image generation device can be supplied, for the monoscopic visualization of the object region (12), with the image (102) of the object region (12) with the at least one image section of the
first image (88) or of the second image (90) determined in the image section determination routine if the aforementioned angle φ does not satisfy the aforementioned condition. [3] A system according to claim 1 or 2, characterized in that the computer program comprises an evaluation routine for evaluating the position, determined by the position determining device (96), of the perpendicular projection of the connecting line of the eyes (78', 80') of the second observer (76') into the plane perpendicular to an optical axis (22) of the microscope main objective system (20), wherein the second image generation device (74') can be supplied with the captured first image (88) and the captured second image (90) for displaying a stereoscopic image of the object region (12) when the position, determined by the position determining device (96), of the perpendicular projection of the connecting line (92') of the eyes (78', 80') of the second observer (76') into the plane perpendicular to the optical axis (22) of the microscope main objective system (20) is parallel to the plane spanned by an optical axis (34) of the first imaging beam path (30) and by an optical axis (34') of the second imaging beam path (30') on the side of the microscope main objective system (20) facing away from the object region (12), or is at an angle φ to this plane for which: −15° < φ < 15° or −10° < φ < 10° or −5° < φ < 5°. [4] A system according to any one of claims 1 to 3, characterized in that the computer program has a routine for extrapolating the at least one image section of the first image (88) or of the second image (90) into an image which can be supplied to the second image generation device (74') for visualizing the object region (12) and which has an image format adapted to an image display of the image generation device (74').
[5] A system according to claim 4, characterized in that the routine for extrapolating the at least one image section of the first image (88) or of the second image (90) converts the at least one image section into an image (102) which can be supplied to the second image generation device (74') for visualizing the object region (12) and which has subregions (106) lying outside the first image (88) or the second image (90). [6] A system according to any one of claims 1 to 5, characterized by at least one rectangular sensor surface (68, 70, 72, 68', 70', 72'). [7] A system according to claim 6, characterized in that the at least one rectangular sensor surface is a square sensor surface. [8] A system (10 (5)) for visualizing an object region (12), comprising an electronic image capture device (16), with an optical assembly (18) which has a first optical channel (32) for a first imaging beam path (30) imaging the object region (12) onto a first sensor surface (68, 70, 72) or a plurality of first sensor surfaces (68, 70, 72) of the image capture device (16) and a second optical channel (32') for a second imaging beam path (30') imaging the object region (12) onto a second sensor surface (68') or a plurality of second sensor surfaces (68', 70', 72') of the image capture device (16), and which contains a microscope main objective system (20) through which the first imaging beam path (30) and the second imaging beam path (30') pass, with a first image generation device (74) for visualizing the object region (12) for a first observer (76), to which a first image (88) of the object region (12) captured on the first sensor surface (68, 70, 72) or the plurality of first sensor surfaces (68, 70, 72) and a second image (90) of the object region (12) captured on the second sensor surface (68') or the plurality of second sensor surfaces (68', 70', 72') can be supplied, and with a second image generation device (74') for visualizing the object region (12) for a second observer (76'), characterized by a
spectral transmission of the second optical channel (32') that differs from the spectral transmission of the first optical channel (32), a computer unit (86) having an image overlay stage (114) for superimposing the first image (88) and the second image (90) to form a monoscopic overlay image (88') of the object region (12), a position determination device (96) for determining the position of the perpendicular projection of the connecting line (92) of the eyes (78', 80') of the second observer (76') into the plane perpendicular to an optical axis (22) of the microscope main objective system (20), and an image rotation stage (118) for the digital rotation and windowing of the monoscopic overlay image (88'), which supplies to the second image generation device (74'), as an image (102) of the object region (12) for the monoscopic visualization of the object region (12), an image that is digitally rotated and windowed relative to the overlay image (88') and that has an image edge parallel to the perpendicular projection of the connecting line (92) of the eyes (78', 80') of the second observer (76') into the object region. [9] A system according to claim 8, characterized in that the computer unit (98) is connected to the second image generation device (74') for supplying the monoscopic overlay image (88') to the second image generation device (74'). [10]
Object area visualization method, in which a first image (88) of the object area (12) and a second image (90) of the object area (12) are captured in a first optical channel (32) having a first imaging beam path (30) and in a second optical channel (32') having a second imaging beam path (30'), whose optical axes (34, 34') form a stereo angle (α) in the object area (12), and in which the object area (12) is visualized for a first observer (76) and a second observer (76'), characterized by: determining the position of the perpendicular projection of the connecting line (92) of the eyes (78', 80') of the second observer (76') into a plane perpendicular to one of the optical axes (34, 34'); determining an image section corresponding to the determined position of the perpendicular projection of the connecting line (92) of the eyes (78', 80') of the second observer (76') from the captured first image (88) of the object area (12) or from the captured second image (90) of the object area (12); and the monoscopic visualization of the object area (12) with an image (102) of the object area (12) containing the at least one image section of the first image (88) or of the second image (90). [11]
Object area visualization method according to claim 10, characterized by the stereoscopic visualization of the object area (12) to the second observer (76') when the determined position of the perpendicular projection of the connecting line (92') of the eyes (78', 80') of the second observer (76') into the plane perpendicular to one of the optical axes (34, 34') is parallel to the plane spanned by the optical axis (34) of the first imaging beam path (30) and by the optical axis (34') of the second imaging beam path (30'), or is at an angle φ to this plane which satisfies the following condition: −15° < φ < 15° or −10° < φ < 10° or −5° < φ < 5°, and by the monoscopic visualization of the object area (12) with the at least one determined image section of the first image (88) or of the second image (90) if the aforementioned angle φ does not satisfy the aforementioned condition. [12] Computer program product for providing image data on a computer unit (98, 98') for visualizing an object area (12) with an object area visualization method according to claim 10 or claim 11, wherein the first image (88) of the object area (12) and the second image (90) of the object area (12) are supplied to the computer unit as image data. [13]
Object area visualization method, in which a first image (88) of the object area (12) and a second image (90) of the object area (12) are captured in a first optical channel (32) having a first imaging beam path (30) and in a second optical channel (32') having a second imaging beam path (30'), whose optical axes (34, 34') form a stereo angle (α) in the object area (12), and in which the object area (12) is visualized for a first observer (76) and a second observer (76'), characterized by the following steps: providing a spectral transmission of the second optical channel (32') that differs from the spectral transmission of the first optical channel (32); superimposing the first image (88) and the second image (90) to form a monoscopic overlay image (88') of the object area (12); determining the position of the perpendicular projection of the connecting line (92) of the eyes (78', 80') of the second observer (76') into the plane perpendicular to an optical axis (22) of the microscope main objective system (20); and the monoscopic visualization of the object area (12) for the second observer (76') by displaying the image information of the overlay image (88') as an image (102') of the object area (12) which is digitally windowed and rotated relative to the overlay image (88') and which has an image edge parallel to the perpendicular projection of the connecting line (92) of the eyes (78', 80') of the second observer (76') into the object area (12). [14] A computer program product for providing image data on a computing unit for visualizing an object area (12) with an object area visualization method according to claim 13, wherein the first image (88) of the object area (12) and the second image (90) of the object area (12) are supplied to the computing unit as image data.
Patent family:
Publication number | Publication date
DE102017109021A1 | 2017-11-16
US10659752B2 | 2020-05-19
CH712453B1 | 2021-10-29
US20170332065A1 | 2017-11-16
Legal status:
2021-03-15 | PCOW | Change of address of patent owner(s)
Priority: Application DE102016108750, filed 2016-05-11